The Most Robust Loss Function for Boosting
Authors
Abstract
Boosting algorithms can be understood as gradient descent on a loss function. It is often pointed out that the typical boosting algorithm, AdaBoost, is seriously affected by outliers. In this paper, loss functions for robust boosting are studied. Based on a concept from robust statistics, we propose a positive-part truncation of the loss function that makes the boosting algorithm robust against extreme outliers. Numerical experiments show that the proposed boosting algorithm is useful for highly noisy data in comparison with other competitors.
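The abstract does not spell out the truncation formula, so the following is only a plausible formalization: for AdaBoost's exponential loss on the margin z = yF(x), capping the per-example loss at an assumed threshold c > 0 can be written through the positive part (u)_+ = max(u, 0), which is presumably what "positive-part truncation" refers to:

\[
  \phi(z) = e^{-z},
  \qquad
  \phi_c(z) = \min\{e^{-z},\, c\} = e^{-z} - \bigl(e^{-z} - c\bigr)_{+}
\]

Because \phi_c is bounded above by c, a single extreme outlier (a very negative margin) contributes at most c to the empirical risk, and its gradient vanishes once it is truncated.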
Similar resources
Bayes, E-Bayes and Robust Bayes Premium Estimation and Prediction under the Squared Log Error Loss Function
In risk analysis based on the Bayesian framework, premium calculation requires specifying a prior distribution for the risk parameter in a heterogeneous portfolio. When prior knowledge is vague, E-Bayesian and robust Bayesian analysis can handle the uncertainty in specifying the prior distribution by considering a class of priors instead of a single prior. In th...
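For orientation (this is a standard fact about the loss named in the title, not something taken from the truncated snippet), the squared log error loss and its Bayes estimator, the exponentiated posterior mean of the log-parameter, are

\[
  L(\theta, \hat{\theta}) = \bigl(\ln \hat{\theta} - \ln \theta\bigr)^{2},
  \qquad
  \hat{\theta}_{\mathrm{Bayes}} = \exp\!\bigl(\mathbb{E}[\ln \theta \mid \text{data}]\bigr).
\]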
Robust Loss Functions for Boosting
Boosting is known as a gradient descent algorithm over loss functions. It is often pointed out that the typical boosting algorithm, AdaBoost, is highly affected by outliers. In this letter, loss functions for robust boosting are studied. Based on the concept of robust statistics, we propose a transformation of loss functions that makes boosting algorithms robust against extreme outliers. Next, ...
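As a minimal sketch of how a truncated loss plugs into a boosting round (not the authors' exact algorithm; the cap c, the helper name, and the weak-learner interface are illustrative assumptions):

import numpy as np

def robust_boost_round(X, y, margins, weak_learners, c=5.0):
    """One illustrative boosting round with a capped exponential loss.

    y is in {-1, +1} and margins[i] = y[i] * F(x[i]) for the current
    ensemble F. Capping the loss at c bounds each example's weight,
    so extreme outliers (very negative margins) are simply ignored.
    """
    # Gradient-descent view: the example weight is the magnitude of
    # the loss derivative. For min(exp(-z), c) the derivative is
    # -exp(-z) where exp(-z) <= c, and 0 in the truncated region.
    loss = np.exp(-margins)
    w = np.where(loss <= c, loss, 0.0)
    if w.sum() == 0.0:          # every example was truncated away
        return None, 0.0
    w = w / w.sum()

    # Choose the weak learner with the smallest weighted error and
    # give it the usual AdaBoost coefficient.
    errs = [np.sum(w * (h(X) != y)) for h in weak_learners]
    best = int(np.argmin(errs))
    err = min(max(errs[best], 1e-12), 1.0 - 1e-12)
    alpha = 0.5 * np.log((1.0 - err) / err)
    return weak_learners[best], alpha

Here weak_learners is assumed to be a list of callables mapping X to ±1 predictions; only the weighting step differs from plain discrete AdaBoost.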
Robust Model for Networked Control System with Packet Loss
Networked control systems (NCS) are widely used in modern control to decrease implementation cost and increase performance, and their use has become all but inevitable. Nevertheless, they suffer from some limitations and deficiencies. Packet loss is one of the main limitations: it affects the control system under various conditions and may ultimately lead to system instability. For this reason...
SPLBoost: An Improved Robust Boosting Algorithm Based on Self-paced Learning
It is known that boosting can be interpreted as a gradient descent technique that minimizes an underlying loss function. Specifically, the loss minimized by traditional AdaBoost is the exponential loss, which has been shown to be very sensitive to random noise and outliers. Therefore, several boosting algorithms, e.g., LogitBoost and SavageBoost, have been proposed to improve the robus...
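The self-paced ingredient named in the title can be sketched as reweighting examples by their current loss: examples the model already handles well keep full weight, while hard ones are excluded until a growing "age" parameter admits them. The hard-threshold form below is the classic self-paced regularizer, used here as an illustrative assumption rather than SPLBoost's exact scheme:

import numpy as np

def self_paced_weights(losses, lam):
    # Hard-threshold self-paced weights: v[i] = 1 for "easy" examples
    # (loss below the age parameter lam), 0 otherwise. Raising lam
    # across boosting rounds gradually admits harder examples.
    return (losses < lam).astype(float)

# Usage sketch: fold v into the per-example boosting weights.
margins = np.array([2.0, 0.1, -3.0])      # -3.0 acts like an outlier
losses = np.exp(-margins)                 # exponential losses
v = self_paced_weights(losses, lam=1.5)   # the outlier gets v = 0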
Boosting in the presence of outliers: adaptive classification with non-convex loss functions
This paper examines the role and efficiency of non-convex loss functions for binary classification problems. In particular, we investigate how to design a simple and effective boosting algorithm that is robust to outliers in the data. The analysis of the role of a particular non-convex loss for prediction accuracy varies depending on the diminishing tail properties of the gradient of th...
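To illustrate the diminishing-tail property mentioned here, consider a bounded non-convex margin loss such as the sigmoid loss (an illustrative choice, not necessarily the loss studied in the paper): its gradient vanishes for large negative margins, so badly misclassified outliers stop driving the fit:

\[
  \phi(z) = \frac{1}{1 + e^{z}},
  \qquad
  \phi'(z) = -\frac{e^{z}}{(1 + e^{z})^{2}} \;\to\; 0
  \quad \text{as } z \to -\infty.
\]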
Journal:
Volume / Issue:
Pages: -
Publication date: 2004